Conditional entropy is a measure of the impurity, uncertainty, or randomness remaining in a random variable once the value of another random variable is known.
In the context of classification problems, conditional entropy quantifies the uncertainty of a target variable given the value of a feature (attribute).

For a binary classification problem, conditional entropy is calculated as:

H(Y | X) = Σ_x P(X = x) · H(Y | X = x)

Where:

- H(Y | X) is the conditional entropy of the class Y given the attribute X
- P(X = x) is the proportion of examples for which the attribute takes the value x
- H(Y | X = x) is the entropy of the class among the examples with X = x
Calculating the conditional entropies for the individual categories/values:

H(Y | X = x) = -Σ_y P(y | x) · log₂ P(y | x)

Where:

- P(y | x) is the proportion of examples in category x that belong to class y
- the sum runs over all the possible categories/values of the class attribute (for a binary class, over the two classes)
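As a sketch, the two formulas above can be implemented directly. The function names and the toy data below are illustrative, not from the original article:

```python
from collections import Counter
import math

def entropy(labels):
    """Entropy H(Y) of a sequence of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def conditional_entropy(xs, ys):
    """H(Y | X): entropy of ys within each category of xs, weighted by category size."""
    n = len(xs)
    by_category = {}
    for x, y in zip(xs, ys):
        by_category.setdefault(x, []).append(y)
    return sum(len(group) / n * entropy(group) for group in by_category.values())

# When X perfectly determines Y, no uncertainty remains:
print(conditional_entropy(["a", "a", "b", "b"], [0, 0, 1, 1]))  # 0.0
```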
For example, consider the class (Play: Yes/No) and the attribute (Outlook: Sunny, Overcast, Rain) from the classic play-tennis dataset. Specifically, for the category "Sunny," we can calculate:

H(Play | Outlook = Sunny) = -P(Yes | Sunny) · log₂ P(Yes | Sunny) - P(No | Sunny) · log₂ P(No | Sunny)
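As a worked instance, the widely used play-tennis counts put 2 Yes and 3 No examples among the 5 Sunny days; these counts are assumed here for illustration, since the original data table is not shown:

```python
import math

# Assumed class counts within the "Sunny" category: 2 Yes, 3 No (out of 5)
p_yes, p_no = 2 / 5, 3 / 5

# H(Play | Outlook = Sunny) = -p_yes*log2(p_yes) - p_no*log2(p_no)
h_sunny = -p_yes * math.log2(p_yes) - p_no * math.log2(p_no)
print(round(h_sunny, 3))  # ≈ 0.971
```

A high value like 0.971 (close to the maximum of 1 bit for a binary class) indicates that knowing Outlook = Sunny leaves the class nearly as uncertain as before.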